Coreper Posted August 18, 2008
Does anyone know of an application that can follow the hyperlinks in a web page and save all the link targets to a certain folder? Please let me know.
Coreper
ɹəuəllıʍ ʇɐb Posted August 19, 2008
wget
Coreper (Author) Posted August 19, 2008
That program isn't compatible with Windows... Is there another one that is? Or a similar program for Windows?
ɹəuəllıʍ ʇɐb Posted August 19, 2008
wget is available for a large number of platforms, including Windows. The latest version is 1.11.4.
Coreper (Author) Posted August 19, 2008
I guess you mean this: http://www.christopherlewis.com/WGet/WGetFiles.htm ? Then maybe you can also tell me how it works...
ɹəuəllıʍ ʇɐb Posted August 20, 2008
How it works: http://www.gnu.org/software/wget/manual/wget.html
Coreper (Author) Posted August 20, 2008
So all I have to do is this?:
- create an HTML document with the links (easy & quick)
- run wget with this on the command line: wget -i links.html
But how do I specify where it should save the target files?
EDIT: Let's say I want the files stored in F:\Downloads\, the HTML file with the links is called links.html, and it is located in the same directory as the wget program. How do I get wget to save all the link targets from links.html to the download folder? What options should I give wget?
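A minimal sketch of those two steps, assuming GNU wget is on the PATH; the URLs and the F:\Downloads folder are made-up placeholders:

```shell
# Example links file (in practice, the page saved from the browser)
printf '%s\n' '<a href="http://example.com/a.zip">file A</a>' \
              '<a href="http://example.com/b.zip">file B</a>' > links.html

# Extract every href into a plain one-URL-per-line list
grep -o 'href="[^"]*"' links.html | sed 's/href="//;s/"$//' > urls.txt

# Feed the list to wget; -P (--directory-prefix) sets the save folder.
# Alternatively, skip the extraction and let wget parse the HTML itself:
#   wget --force-html -i links.html -P F:\Downloads
# (the wget lines stay commented out here because the URLs are placeholders)
# wget -i urls.txt -P F:\Downloads
```

-i reads input URLs from a file, --force-html tells wget to treat that file as an HTML page rather than a bare URL list, and -P answers the "where to save" question directly.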
Scarecrow Man Posted August 20, 2008
DownThemAll - lets you download all the links or images contained in a web page.
FlashGot - the free Mozilla / Firefox / Flock / Thunderbird extension (compatible with Netscape too), meant to handle single and massive ("all" and "selection") downloads with several external download managers.
Both are Firefox extensions.
Coreper (Author) Posted August 20, 2008
Do you also know of non-Firefox options? Like IE add-ons, or just a regular program?
Scarecrow Man Posted August 20, 2008
wget :)
or http://www.freedownloadmanager.org/ for example. Most download managers can do this.
ɹəuəllıʍ ʇɐb Posted August 21, 2008
"But how do I specify where it should save the target files?"
wget is easiest to run from a batch file script (.BAT); just cd to the target folder before running wget.
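A sketch of that batch-file approach; the drive letters and paths here are hypothetical:

```bat
@echo off
rem Switch to the folder the downloads should land in
rem (/d also changes the drive letter if needed)
cd /d F:\Downloads
rem Let wget read the link list; --force-html parses the file as HTML
C:\wget\wget.exe --force-html -i C:\wget\links.html
```

Since wget saves into the current directory by default, the cd does the work; passing -P F:\Downloads to wget instead would achieve the same thing without the cd.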
Coreper (Author) Posted August 21, 2008
I found an easier and (for me) more understandable way:
- I made an HTML file with the links (which have link text instead of the raw addresses)
- I used LinksExtractor (freeware) to extract the links and copy them to the clipboard
- then I placed them in an HTML file and changed a few things using FrontPage (it has some text replacement options)
- then I used Free Download Manager to import that list of links, and it started downloading all the files
Thanks for the help!